🧠 Super Memory Trick

  • 🔥 Smoke = Build is not broken
  • 🔥 Sanity = Fix is logical
  • 🔥 Regression = Nothing else broke

Software testing is classified in 3 main ways (plus a reminder for code changes):

  • HOW → Testing Techniques (White/Black/Grey Box)
  • WHERE → Testing Levels (Unit → Integration → System → UAT → Release)
  • WHAT → Testing Types (Smoke, Sanity, Regression, Performance, Security, etc.)
  • WHEN CODE CHANGES → Regression / Retesting

✅ Complete Software Testing Summary Table (One Page Revision)

| Testing Type | Level / Category | Main Purpose (Key Idea) | Who Performs | When Done | Example |
| --- | --- | --- | --- | --- | --- |
| White Box Testing | Technique | Tests internal code logic | Developers | During coding | Test all branches in if/else |
| Black Box Testing | Technique | Tests input/output behavior | QA Testers, Users | After build ready | Login with valid/invalid credentials |
| Unit Testing | Level | Tests individual function/module | Developers | First testing stage | Test add() function |
| Integration Testing | Level | Tests interaction between modules | Dev + QA | After Unit Testing | Login module → Dashboard module |
| System Testing | Level | Tests complete system end-to-end | QA Team | After Integration Testing | Full e-commerce checkout flow |
| Acceptance Testing (UAT) | Level | Confirms system meets business needs | Clients/End Users | After System Testing | User verifies banking transfer works |
| Functional Testing | Type (Black Box) | Checks features work as required | QA Testers | Any testing level | Login, Signup, Payment work |
| Regression Testing | Type | Ensures new changes didn’t break old features | QA + Automation | After bug fix / new feature | Password reset fix → Login still works |
| Smoke Testing | Type | Checks build stability (basic critical functions) | QA / Dev | After new build deployment | App opens, Login works |
| Sanity Testing | Type | Checks specific fix/change works correctly | QA Testers | After small bug fix | Discount bug fixed → verify coupon works |
| Retesting | Type | Confirms a specific bug is fixed | QA Testers | After defect fix | Test same failed case again |
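The unit-testing row above ("Test add() function") can be sketched as a minimal example; the function and test cases are illustrative:

```python
# Minimal unit-testing sketch for the add() example in the table above.
def add(a, b):
    return a + b

def test_add():
    # Cover typical, boundary, and negative inputs.
    assert add(2, 3) == 5
    assert add(0, 0) == 0
    assert add(-1, 1) == 0

test_add()
```

In a real project this would live in a test file run by a test runner, but the idea is the same: one function, tested in isolation.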

⭐ Quick Interview Comparison Table

| Feature | Smoke | Sanity | Regression |
| --- | --- | --- | --- |
| Scope | Broad & basic | Narrow & focused | Wide & deep |
| Done After | New build | Small change/bug fix | Any code change |
| Goal | Build is stable? | Fix works correctly? | Old features still work? |
| Test Depth | Shallow | Medium | Deep |
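In practice, smoke/sanity/regression runs are selected by tagging tests (real projects usually use pytest markers; this plain-Python sketch just shows the idea of suite-based selection, and the test bodies are placeholders):

```python
# Sketch: tag tests by suite so CI can run smoke/sanity/regression selectively.
SUITES = {}

def suite(name):
    """Decorator that registers a test function under a suite name."""
    def register(fn):
        SUITES.setdefault(name, []).append(fn)
        return fn
    return register

@suite("smoke")
def test_app_opens():
    assert True  # placeholder: e.g., home page returns HTTP 200

@suite("sanity")
def test_coupon_after_fix():
    assert True  # placeholder: e.g., coupon applies after discount-bug fix

@suite("regression")
def test_login_still_works():
    assert True  # placeholder: e.g., login flow after password-reset fix

def run(name):
    """Run every test registered under a suite; return how many ran."""
    for fn in SUITES.get(name, []):
        fn()
    return len(SUITES.get(name, []))
```

So a CI pipeline would call `run("smoke")` on every build, `run("sanity")` after a small fix, and `run("regression")` on any code change, mirroring the table above.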

Software testing is divided in 3 main ways:

1. Testing Techniques/ Approaches (HOW you test): 3 techniques

Describe how test cases are designed.

  • White Box Testing (Tester knows internal code/logic [all internal code paths, conditions, and loops])[Structure-Based]:
    • Example: Verify all if–else branches in the login code execute correctly for valid and invalid credentials.
  • Black Box Testing (Tester does not know code, only input/output) [Specification-Based]
    • Example: Enter valid and invalid username/password combinations and verify the login success or error message.
  • Grey Box Testing (Tester has partial knowledge of the system) [Test system behavior using partial knowledge of internal logic such as APIs, databases, or architecture.]
    • Example: Submit login credentials and verify the API response and database session record are created correctly.
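The black-box login example above can be sketched as a pure input/output check; the credentials, function, and messages are hypothetical:

```python
# Black-box sketch: the tester only knows inputs and expected outputs;
# the internals of login() are treated as opaque. Data is illustrative.
VALID_USERS = {"alice": "s3cret"}

def login(username, password):
    if VALID_USERS.get(username) == password:
        return "Welcome"
    return "Invalid credentials"

# Specification-based checks: valid and invalid credential combinations.
assert login("alice", "s3cret") == "Welcome"
assert login("alice", "wrong") == "Invalid credentials"
assert login("bob", "s3cret") == "Invalid credentials"
```

A white-box tester would instead look inside `login()` and make sure both branches of the `if` execute; a grey-box tester would also check the session record behind it.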

2. Testing Levels (WHERE you test): 4 levels

Describe at what stage or module level testing happens.

Unit → Integration → System → UAT → Release

Smoke → Sanity → Regression (common in Agile)

  • Unit Testing (Tests individual function/module)
  • Integration Testing (Tests interaction between combined modules)
    • Types:
      • Big Bang Integration
      • Incremental Integration (Top-down, Bottom-up, Sandwich)
  • System Testing (Tests complete application end-to-end) [Functional + Non-functional validation]
  • Acceptance Testing (UAT) (Tests if system is ready for business use - Business/user approval)
    • Types:
      • User Acceptance Testing (UAT): End users verify the system meets their day-to-day business needs before release.
      • Business Acceptance Testing (BAT): Business stakeholders validate workflows, rules, and objectives align with business requirements.
      • Operational Acceptance Testing (OAT): Operations/IT teams ensure the system is ready for production support, monitoring, backup, and recovery.
      • Regulatory Acceptance Testing: Confirms the system complies with legal, industry, or regulatory standards (e.g., HIPAA, PCI-DSS).
      • Alpha Testing: Internal testing performed by the organization before releasing the product to external users.
      • Beta Testing: Limited release to real users in a production-like environment to gather feedback before final launch.
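The "Login module → Dashboard module" integration idea can be sketched as two components verified together; all class and token names are illustrative:

```python
# Integration-testing sketch: a login service and a session store are
# combined and tested as a pair, not in isolation. Names are hypothetical.
class SessionStore:
    def __init__(self):
        self.sessions = {}

    def create(self, user):
        token = f"token-{user}"
        self.sessions[token] = user
        return token

class LoginService:
    def __init__(self, store):
        self.store = store

    def login(self, user, password):
        if password == "ok":  # stand-in for a real credential check
            return self.store.create(user)
        return None

store = SessionStore()
svc = LoginService(store)
token = svc.login("alice", "ok")

# The integration assertion: after login, the session record really exists.
assert token in store.sessions
assert svc.login("alice", "bad") is None
```

Unit tests would cover each class alone; the integration test checks the hand-off between them, which is where interface mismatches usually hide.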

3. Testing Types (WHAT you test):

Describe the purpose or goal of testing.

A. Functional Testing Types (Feature-based): 5 types

Functional Testing (Validates software features work correctly)

Examples: Login testing, Payment testing, API response testing

  • Smoke Testing (Checks build stability after new build)
  • Sanity Testing (Checks specific fix/change works correctly)
  • Regression Testing (Ensures new changes didn’t break old features)
  • Retesting (Confirms a specific bug fix works)
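The retesting vs. regression distinction can be shown with the discount-bug example from the table; the coupon logic and codes are hypothetical:

```python
# Retesting sketch: re-run the exact case that previously failed, after the
# (hypothetical) discount-bug fix. Coupon codes and rates are illustrative.
def apply_coupon(total, code):
    # Fixed behavior: "SAVE10" now gives 10% off (the bug was that it gave 0%).
    if code == "SAVE10":
        return round(total * 0.9, 2)
    return total

# Retesting: the previously failing case, executed again to confirm the fix.
assert apply_coupon(100.0, "SAVE10") == 90.0

# Regression: unrelated behavior still works after the fix.
assert apply_coupon(100.0, "INVALID") == 100.0
```

Retesting is the one failed case run again; regression is everything around it.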

B. Non-Functional Testing Types (Quality-based): 5 types

  • Performance Testing (Speed, load, scalability)
  • Security Testing (Protection from attacks/vulnerabilities)
  • Usability Testing (User friendliness)
  • Compatibility Testing (Works across browsers/devices/OS)
  • Reliability Testing (Stability over time)
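A minimal performance-testing sketch: time a critical operation and assert it stays within a latency budget. The operation and the 0.5 s budget are illustrative stand-ins:

```python
import time

# Performance-testing sketch: measure elapsed time for a critical operation
# and enforce a latency budget. The sleep simulates real work (e.g., a
# checkout API call); the 0.5 s threshold is a hypothetical budget.
def critical_operation():
    time.sleep(0.01)  # stand-in for real work
    return "ok"

start = time.perf_counter()
result = critical_operation()
elapsed = time.perf_counter() - start

assert result == "ok"
assert elapsed < 0.5, f"too slow: {elapsed:.3f}s"
```

Real load testing uses dedicated tools and many concurrent users, but the core idea is the same: measured time compared against an agreed budget.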

Other:

  • Experience-Based Testing: Based on tester’s experience, intuition, and skill
    • No strict reliance on code or documentation
    • Includes techniques like:
      • Exploratory Testing
      • Error Guessing
      • Ad-hoc Testing
    • Example: Trying invalid inputs because similar apps failed there before.
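The error-guessing example can be sketched as probing inputs that experience says often break forms (empty, whitespace-only, overly long); the validation rules are hypothetical:

```python
# Error-guessing sketch: feed a login form the inputs that commonly break
# similar apps. The validation rules and 32-char limit are illustrative.
def validate_username(name):
    name = name.strip()
    if not name:
        return "error: empty"
    if len(name) > 32:
        return "error: too long"
    return "ok"

# Guesses drawn from experience, not from a specification document.
guesses = ["", "   ", "x" * 1000, "alice"]
results = [validate_username(g) for g in guesses]
assert results == ["error: empty", "error: empty", "error: too long", "ok"]
```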

Job Ready:

LinkedIn: Ensures production-ready software in Agile by combining unit, integration, system, regression, UAT, and performance/security testing.

Non Technical: I make sure every new feature works correctly, doesn’t break anything else, and meets user and business needs before release.

One-sentence killer answer: In Agile, every feature is validated end-to-end: developers do unit and white-box testing, QA ensures stability with smoke, sanity, regression, and system testing, while UAT, performance, and security checks confirm it’s production-ready.

15s version: In an Agile environment, every change goes through layered testing. Developers validate logic with unit and white-box testing, QA runs smoke and sanity tests on each build, then performs integration and system testing using black- and grey-box approaches. Automated regression ensures existing features remain stable, and UAT plus performance and security checks complete the release before production.

STAR-formatted answer:

A senior-level STAR-formatted answer for the money-transfer feature example, ready to deliver in interviews:

S – Situation: “In our online banking app, we were adding a new money-transfer feature that allowed users to send funds to other accounts.”

T – Task: “My goal was to ensure the feature worked correctly, didn’t break existing functionality, and met both user and business requirements before production release.”

A – Action: “As part of the Agile sprint, developers performed unit and white-box testing on core logic, QA conducted smoke, sanity, regression, integration, and system testing using black- and grey-box approaches, and automated regression tests were run for every code change. Finally, UAT, performance, and security checks were performed with stakeholders to validate real-world usage.”

R – Result: “The feature launched smoothly with zero critical defects, all business and regulatory requirements were met, and post-release support tickets dropped by 90% compared to previous releases.”

Automation-focused version (Selenium / API / CI-CD)

In an Agile setup, every money-transfer feature is validated end-to-end using automation. Developers write unit and API tests, while QA implements Selenium UI tests and API tests in CI/CD pipelines. Each build triggers automated smoke, sanity, and regression tests, ensuring critical workflows like login, transfer, and balance updates remain stable. Automated tests are integrated into the pipeline so failures are caught immediately, and UAT, performance, and security checks are conducted before production deployment.


Full Example: “Let me explain using an online banking money-transfer feature.”

“When a new transfer feature is developed, testing starts at different levels and from different perspectives. First, developers perform unit testing and white-box testing to verify internal logic such as balance validation, fee calculation, and error handling.

Next comes integration testing, where we verify interactions between modules like login, transfer service, database, and notification systems.

After that, the QA team performs system testing, validating the complete end-to-end flow — login, beneficiary selection, transfer, and confirmation — mainly using black-box testing.

Before release, UAT is done by business users to confirm the feature meets real banking requirements and regulatory rules.

For testing types, we run smoke testing after each new build to ensure the app is stable, sanity testing after specific bug fixes, retesting to confirm the fix works, and regression testing to ensure existing features like login and balance display aren’t broken.

We also perform non-functional testing, including performance testing for high user load, security testing to prevent unauthorized transfers, usability testing for user friendliness, and compatibility testing across browsers and devices.”