Pre-Review
The pre-review is Step 1 of the TQS certification process, where the project team self-diagnoses specification compliance before requesting a formal audit. This chapter defines the purpose of the pre-review, the self-assessment methodology, gap analysis, remediation planning, and pre-consultation procedures.
31.2.1. Purpose of Pre-Review
The core purpose of the pre-review is to minimize the risk of failing the formal technical audit. If a project receives a fail verdict in the formal audit, it must restart from Step 1 after remediation, adding at least 3-4 weeks to the schedule. By identifying and remediating non-compliant items in advance through the pre-review, the overall time to certification can be reduced.
Conducting a pre-review provides the following benefits.
- The project's specification compliance status can be objectively assessed before the formal audit.
- Remediation priorities for non-compliant items can be clearly established.
- Team understanding of TQS specifications is enhanced, fostering an internalized culture of specification compliance.
- Unnecessary re-audits are prevented, conserving resources for both the TQS Committee and the project team.
The pre-review is conducted under the responsibility of the project team. The Project Lead bears overall responsibility for the pre-review, and the development team performs the actual assessment work. The duration of the pre-review is 1-2 weeks, depending on project size and specification compliance level.
31.2.2. Self-Assessment
The self-assessment is the process of verifying compliance for each item based on the TQS Certification Checklist (Chapter 32). The self-assessment is performed using a combination of quantitative verification with automated tools and manual verification.
31.2.2.1. Checklist-Based Assessment
The project team must verify compliance for all items in the TQS Certification Checklist (Chapter 32). The following information must be recorded for each item.
| Record Item | Description |
|---|---|
| Item Number | Unique number of the checklist item |
| Compliance Status | Compliant / Partially Compliant / Non-Compliant |
| Supporting Evidence | Screenshots, configuration file paths, or links proving compliance |
| Remarks | Reasons and remediation plans for partially compliant or non-compliant items |
When conducting the checklist assessment, the mandatory/recommended classification of each item must be clearly understood. All mandatory items must be 100% compliant to obtain certification, and the compliance rate of recommended items affects the certification grade (Basic/Advanced/Premier).
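The mandatory/recommended logic above can be sketched in code. The following is a minimal illustration, assuming hypothetical item numbers, record field names, and a simple summary format; the actual checklist items and grade thresholds are defined in Chapter 32 and by the TQS Committee.

```python
# Illustrative self-assessment records; item numbers and field names are
# assumptions for this sketch, not part of the TQS specification.
records = [
    {"item": "CQ-01",  "class": "mandatory",   "status": "Compliant"},
    {"item": "CQ-02",  "class": "mandatory",   "status": "Compliant"},
    {"item": "SEC-01", "class": "mandatory",   "status": "Non-Compliant"},
    {"item": "FE-01",  "class": "recommended", "status": "Compliant"},
    {"item": "FE-02",  "class": "recommended", "status": "Partially Compliant"},
]

def summarize(records):
    """Mandatory items must all be Compliant; the recommended-item
    compliance rate feeds into the certification grade."""
    mandatory = [r for r in records if r["class"] == "mandatory"]
    recommended = [r for r in records if r["class"] == "recommended"]
    mandatory_ok = all(r["status"] == "Compliant" for r in mandatory)
    rec_rate = (
        sum(r["status"] == "Compliant" for r in recommended) / len(recommended)
        if recommended else 1.0
    )
    return {"mandatory_ok": mandatory_ok, "recommended_rate": rec_rate}

# summarize(records): mandatory_ok is False because SEC-01 is Non-Compliant;
# recommended_rate is 0.5 (1 of 2 recommended items compliant).
```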
31.2.2.2. Automated Tool Execution
Quantitative items in the self-assessment are verified by running automated tools. The execution results of the following tools must be collected and recorded.
| Tool | Verification Target | Execution Command | Criteria |
|---|---|---|---|
| Spotless | Backend code formatting | mvn spotless:check | 0 violations |
| ESLint | Frontend linting | npx eslint . | 0 errors |
| Prettier | Frontend formatting | npx prettier --check . | 0 violations |
| JaCoCo | Backend test coverage | mvn test jacoco:report | Line 80%, Branch 70% |
| Vitest | Frontend test coverage | npx vitest run --coverage | Line 80%, Branch 70% |
| Lighthouse | Frontend performance/accessibility | Chrome DevTools or CLI | Performance score 90+ |
| OWASP Dependency-Check | Dependency security vulnerabilities | mvn dependency-check:check | 0 vulnerabilities with CVSS 7+ |
The execution results of automated tools serve as the foundational data for deliverables submitted during the formal audit. Therefore, it is recommended to save execution results as files and record the execution date/time and environment information together.
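The collection of tool results with timestamps and environment information can be automated. The sketch below is one possible approach, assuming a JSON output file and a record format of our own invention; the tool commands come from the table above, and the helper names (`run_check`, `collect_results`, `pre-review-results.json`) are illustrative.

```python
import json
import platform
import subprocess
from datetime import datetime, timezone

def run_check(name, command):
    """Run one verification tool and return a result record that captures
    the execution timestamp and environment, as recommended above."""
    proc = subprocess.run(command, capture_output=True, text=True)
    return {
        "tool": name,
        "command": " ".join(command),
        "passed": proc.returncode == 0,   # all listed tools exit 0 when clean
        "executed_at": datetime.now(timezone.utc).isoformat(),
        "environment": f"{platform.system()} / Python {platform.python_version()}",
        "output_tail": proc.stdout[-500:],  # keep the end of the log as evidence
    }

def collect_results(checks, path="pre-review-results.json"):
    """Run every check and save the records for the audit deliverables."""
    results = [run_check(name, cmd) for name, cmd in checks]
    with open(path, "w") as f:
        json.dump(results, f, indent=2)
    return results

# Example usage (run in the project root):
# collect_results([("Spotless", ["mvn", "spotless:check"]),
#                  ("ESLint", ["npx", "eslint", "."]),
#                  ("Prettier", ["npx", "prettier", "--check", "."])])
```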
31.2.2.3. Identifying Non-Compliant Items
Items classified as non-compliant or partially compliant in the self-assessment results must be compiled into a separate list. The non-compliant items list must include the following information.
- Item number and item name
- Mandatory/recommended classification
- Current status (compliance rate or current state description for partially compliant items)
- Reason for non-compliance
- Estimated remediation difficulty (High/Medium/Low)
The non-compliant items list is used as input data for gap analysis and remediation planning.
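Deriving that list from the self-assessment records is a simple filter. The sketch below assumes illustrative item numbers and field names mirroring the bullet points above.

```python
# Illustrative self-assessment records; field names are assumptions
# chosen to mirror the required list contents above.
assessment = [
    {"item": "TS-03", "name": "Line Coverage", "class": "mandatory",
     "status": "Partially Compliant", "current": "72% (target 80%)",
     "reason": "service layer untested", "difficulty": "Medium"},
    {"item": "SEC-05", "name": "OWASP Scan", "class": "mandatory",
     "status": "Non-Compliant", "current": "not applied",
     "reason": "plugin not configured", "difficulty": "Low"},
    {"item": "CC-01", "name": "Spotless", "class": "mandatory",
     "status": "Compliant", "current": "configured in build",
     "reason": "", "difficulty": ""},
]

def noncompliant_items(assessment):
    """Everything that is not fully Compliant feeds into gap analysis."""
    return [r for r in assessment if r["status"] != "Compliant"]
```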
31.2.3. Gap Analysis
Gap analysis is the process of systematically analyzing the differences between the project's current state and TQS requirements. Based on non-compliant items identified in the self-assessment, the scale, causes, and impact of gaps are analyzed.
31.2.3.1. Comparison Matrix
The first step of gap analysis is to create a comparison matrix of the current state and TQS requirements. The comparison matrix is prepared for each checklist area, organized so that the compliance status of each item can be visually understood.
Compliance status is classified into the following 3 categories.
| Compliance Status | Definition | Action |
|---|---|---|
| Compliant | Fully satisfies TQS requirements | No additional action required |
| Partially Compliant | Partially satisfies TQS requirements | Remediation of non-compliant portions required |
| Non-Compliant | Does not satisfy TQS requirements at all | Full remediation or new implementation required |
31.2.3.2. Gap Analysis Results Example
The following is an example of a gap analysis results table. Project teams should reference this format when preparing the actual gap analysis results for their project.
| Area | Item | TQS Requirement | Current State | Compliance Status | Gap Description |
|---|---|---|---|---|---|
| Code Convention | Spotless Application | Automated format verification during build | spotless-maven-plugin configured | Compliant | --- |
| Testing | Line Coverage | 80% or above | 72% | Partially Compliant | 8 percentage points short; service layer tests need strengthening |
| Security | OWASP Scan | 0 vulnerabilities with CVSS 7+ | Not applied | Non-Compliant | Plugin configuration and initial scan required |
| CI/CD | Security Scan Step | Security scan included in pipeline | Only lint/test/build configured | Non-Compliant | Security scan step needs to be added to CircleCI configuration |
| Frontend | Lighthouse Score | Performance score 90+ | 85 points | Partially Compliant | Image optimization and code splitting improvements needed |
If the number of non-compliant mandatory items exceeds 10% of all mandatory items in the gap analysis results, significant time may be required for remediation, so the audit schedule must be planned carefully.
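The 10% threshold above is a simple ratio over mandatory items. A minimal sketch, assuming the same record format as the self-assessment examples:

```python
def mandatory_gap_ratio(records):
    """Fraction of mandatory items that are not fully Compliant.
    Above 0.10, plan the audit schedule conservatively."""
    mandatory = [r for r in records if r["class"] == "mandatory"]
    gaps = [r for r in mandatory if r["status"] != "Compliant"]
    return len(gaps) / len(mandatory)

def schedule_warning(records, threshold=0.10):
    return mandatory_gap_ratio(records) > threshold
```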
31.2.4. Remediation Planning
Based on the gap analysis results, a specific remediation plan is developed for non-compliant items. The remediation plan is overseen by the Project Lead and must be developed in consultation with the development team to establish a feasible schedule.
31.2.4.1. Priority Determination
Remediation priorities for non-compliant items are determined according to the following criteria.
| Priority | Criteria | Examples |
|---|---|---|
| 1st | Security-related mandatory items | Spring Security configuration, SQL parameter binding, secret management |
| 2nd | Code quality-related mandatory items | Formatter application, naming conventions, package structure |
| 3rd | Testing/CI-related mandatory items | Test coverage, CI/CD pipeline configuration |
| 4th | Recommended items (for Advanced/Premier grade targets) | OWASP scan, Lighthouse optimization, keyboard navigation |
Mandatory items must be remediated. Remediation of recommended items is determined based on the target certification grade. If targeting only Basic certification, remediation of recommended items is optional.
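Ordering remediation work by these criteria amounts to a sort over a priority mapping. The area labels and item data below are illustrative; the mapping mirrors the priority table above.

```python
# Area-to-priority mapping following the table above; the area keys
# themselves are illustrative labels, not TQS-defined identifiers.
PRIORITY = {"security": 1, "code-quality": 2, "test-ci": 3, "recommended": 4}

items = [
    {"item": "FE-07",  "area": "recommended",  "work": "Lighthouse optimization"},
    {"item": "TS-03",  "area": "test-ci",      "work": "raise line coverage to 80%"},
    {"item": "SEC-02", "area": "security",     "work": "Spring Security configuration"},
]

def by_priority(items):
    """Security-related mandatory work first, recommended items last."""
    return sorted(items, key=lambda r: PRIORITY[r["area"]])
```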
31.2.4.2. Remediation Schedule
The remediation plan must include the following information for each non-compliant item.
- Remediation work description (specific scope of work)
- Assignee (developer or team performing the work)
- Start date and expected completion date
- Dependencies (sequential relationships with other remediation tasks)
- Verification method (how to confirm remediation completion)
The remediation schedule must fit within the project's existing development schedule. Situations where the release schedule is delayed for certification preparation must be minimized.
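A remediation plan entry can carry the fields listed above, and the dependency constraint is mechanically checkable. The record format, item numbers, and dates below are illustrative assumptions.

```python
from datetime import date

# Illustrative plan entries mirroring the required fields above;
# item numbers, assignees, and dates are invented for this sketch.
plan = [
    {"item": "SEC-05", "work": "configure OWASP dependency scan",
     "assignee": "backend team",
     "start": date(2024, 3, 4), "end": date(2024, 3, 6), "depends_on": [],
     "verify": "mvn dependency-check:check reports no CVSS 7+ findings"},
    {"item": "CI-02", "work": "add security scan step to CircleCI",
     "assignee": "devops",
     "start": date(2024, 3, 7), "end": date(2024, 3, 8),
     "depends_on": ["SEC-05"],
     "verify": "pipeline run includes the scan step"},
]

def schedule_consistent(plan):
    """Every task must start after all of its dependencies have completed."""
    end_by_item = {t["item"]: t["end"] for t in plan}
    return all(
        t["start"] > end_by_item[dep]
        for t in plan
        for dep in t["depends_on"]
    )
```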
31.2.4.3. Estimated Duration
The estimated duration for remediation work varies depending on the scale and difficulty of non-compliant items. The following are estimated durations by major remediation work type.
| Remediation Work Type | Estimated Duration | Notes |
|---|---|---|
| Formatter/linter configuration | 1-2 days | Batch application possible after tool setup |
| Naming convention refactoring | 2-5 days | Proportional to project size |
| Test coverage improvement | 3-7 days | Proportional to the deficit |
| CI/CD pipeline configuration | 1-3 days | Varies depending on existing pipeline availability |
| Security configuration | 2-5 days | Varies depending on number and complexity of items |
| Project structure refactoring | 3-10 days | Proportional to project size and current state |
31.2.5. Pre-Consultation
The project team may request an informal pre-consultation from the TQS Committee during the pre-review process. Pre-consultation is used when specification interpretation is unclear or advice on application methods is needed.
31.2.5.1. Consultation Request Method
Pre-consultation is requested informally from the TQS Committee via email or internal communication tools. The following information must be included in the request.
- Project name and contact person
- Query item (checklist number or specification clause)
- Specific query content
- Current application status (including code or configuration examples if possible)
Pre-consultation is independent of the audit request; a team that has not received pre-consultation may still request an audit without any disadvantage.
31.2.5.2. Consultation Scope
The scope covered in pre-consultation includes the following.
- Interpretation queries on specific TQS specification clauses
- Advice on specification application methods for specific technology stacks
- Confirmation of compliance criteria for checklist items
- Preliminary opinions on self-assessment results
The scope not covered in pre-consultation includes the following.
- Writing specific code on behalf of the project team
- Pre-audit verdicts (predicting pass/fail outcomes)
- Guaranteeing verdict outcomes in the formal audit
31.2.5.3. Effect of Consultation Results
Opinions provided in pre-consultation are for reference only and do not directly affect formal audit results. Even if an opinion of "no issues" is received in pre-consultation, the item may still be judged as non-compliant in the formal audit. This is because pre-consultation is informal in nature, and stricter criteria may be applied in the formal audit.
However, if the advice provided in pre-consultation is diligently implemented, the likelihood of passing the formal audit can be increased. It is recommended that project teams record pre-consultation details and include the results of actions taken based on the advice in their deliverables.