Diffstat (limited to 'blog/2023-06-20-audit-review-cheatsheet.org')
-rw-r--r-- blog/2023-06-20-audit-review-cheatsheet.org | 138
1 files changed, 72 insertions, 66 deletions
diff --git a/blog/2023-06-20-audit-review-cheatsheet.org b/blog/2023-06-20-audit-review-cheatsheet.org
index 37abafa..6d964fa 100644
--- a/blog/2023-06-20-audit-review-cheatsheet.org
+++ b/blog/2023-06-20-audit-review-cheatsheet.org
@@ -1,75 +1,81 @@
-+++
-date = 2023-06-20
-title = "Cheatsheet: Review Audit Test Results"
-description = "A handy cheatsheet for reviewing audit testing on FSA and SOC report engagements."
-+++
+#+title: Audit Review Checklist
+#+date: 2023-06-20
 
-## Overview
-This post is a *very* brief overview on the basic process to review audit
-test results, focusing on work done as part of a financial statement audit
-(FSA) or service organization controls (SOC) report.
+** Overview
+:PROPERTIES:
+:CUSTOM_ID: overview
+:END:
+This post is a /very/ brief overview on the basic process to review
+audit test results, focusing on work done as part of a financial
+statement audit (FSA) or service organization controls (SOC) report.
 
-While there are numerous different things to review and look for - all varying
-wildly depending on the report, client, and tester - this list serves as a solid
-base foundation for a reviewer.
+While there are numerous different things to review and look for - all
+varying wildly depending on the report, client, and tester - this list
+serves as a solid base foundation for a reviewer.
 
-I have used this throughout my career as a starting point to my reviews, and it
-has worked wonders for creating a consistent and objective template to my
-reviews. The goal is to keep this base high-level enough to be used on a wide
-variety of engagements, while still ensuring that all key areas are covered.
-
-## Cheatsheet
+I have used this throughout my career as a starting point to my reviews,
+and it has worked wonders for creating a consistent and objective
+template to my reviews. The goal is to keep this base high-level enough
+to be used on a wide variety of engagements, while still ensuring that
+all key areas are covered.
+** Cheatsheet
+:PROPERTIES:
+:CUSTOM_ID: cheatsheet
+:END:
 1. [ ] Check all documents for spelling and grammar.
 2. [ ] Ensure all acronyms are fully explained upon first use.
-3. [ ] For all people referenced, use their full names and job titles upon
-first use.
-4. [ ] All supporting documents must cross-reference to the lead sheet and
-vice-versa.
+3. [ ] For all people referenced, use their full names and job titles
+   upon first use.
+4. [ ] All supporting documents must cross-reference to the lead sheet
+   and vice-versa.
 5. [ ] Verify that the control has been adequately tested:
-   - [ ] **Test of Design**: Did the tester obtain information regarding how
-     the control should perform normally and abnormally (e.g., emergency
-     scenarios)?
-   - [ ] **Test of Operating Effectiveness**: Did the tester inquire, observe,
-     inspect, or re-perform sufficient evidence to support their
-     conclusion over the control? Inquiry alone is not adequate!
-6. [ ] For any information used in the control, whether by the control operator
-or by the tester, did the tester appropriately document the source (system or
-person), extraction method, parameters, and completeness and accuracy (C&A)?
-   - [ ] For any reports, queries, etc. used in the extraction, did the tester
-     include a copy and notate C&A considerations?
-7. [ ] Did the tester document the specific criteria that the control is being
-tested against?
-8. [ ] Did the tester notate in the supporting documents where each criterion
-   was satisfied?
-9. [ ] If testing specific policies or procedures, are the documents adequate?
-   - [ ] e.g., a test to validate that a review of policy XYZ occurs
-     periodically should also evaluate the sufficiency of the policy itself, if
-     meant to cover the risk that such a policy does not exist and is not
-     reviewed.
+   - [ ] *Test of Design*: Did the tester obtain information regarding
+     how the control should perform normally and abnormally (e.g.,
+     emergency scenarios)?
+   - [ ] *Test of Operating Effectiveness*: Did the tester inquire,
+     observe, inspect, or re-perform sufficient evidence to support
+     their conclusion over the control? Inquiry alone is not adequate!
+6. [ ] For any information used in the control, whether by the control
+   operator or by the tester, did the tester appropriately document the
+   source (system or person), extraction method, parameters, and
+   completeness and accuracy (C&A)?
+   - [ ] For any reports, queries, etc. used in the extraction, did the
+     tester include a copy and notate C&A considerations?
+7. [ ] Did the tester document the specific criteria that the control is
+   being tested against?
+8. [ ] Did the tester notate in the supporting documents where each
+   criterion was satisfied?
+9. [ ] If testing specific policies or procedures, are the documents
+   adequate?
+   - [ ] e.g., a test to validate that a review of policy XYZ occurs
+     periodically should also evaluate the sufficiency of the policy
+     itself, if meant to cover the risk that such a policy does not
+     exist and is not reviewed.
 10. [ ] Does the test cover the appropriate period under review?
-    - [ ] If the test is meant to cover only a portion of the audit period, do
-      other controls exist to mitigate the risks that exist for the remainder of
-      the period?
-11. [ ] For any computer-aided audit tools (CAATs) or other automation
-techniques used in the test, is the use of such tools explained and
-appropriately documented?
-12. [ ] If prior-period documentation exists, are there any missing pieces of
-evidence that would further enhance the quality of the test?
-13. [ ] Was any information discovered during the walkthrough or inquiry phase
-that was not incorporated into the test?
-14. [ ] Are there new rules or expectations from your company's internal
-guidance or your regulatory bodies that would affect the audit approach for this
-control?
-15. [ ] Was an exception, finding, or deficiency identified as a result of this
-test?
-    - [ ] Was the control deficient in design, operation, or both?
-    - [ ] What was the root cause of the finding?
-    - [ ] Does the finding indicate other findings or potential fraud?
-    - [ ] What's the severity and scope of the finding?
-    - [ ] Do other controls exist as a form of compensation against the
-      finding's severity, and do they mitigate the risk within the control
-      objective?
-    - [ ] Does the finding exist at the end of the period, or was it resolved
-      within the audit period?
\ No newline at end of file
+    - [ ] If the test is meant to cover only a portion of the audit
+      period, do other controls exist to mitigate the risks that exist
+      for the remainder of the period?
+11. [ ] For any computer-aided audit tools (CAATs) or other automation
+    techniques used in the test, is the use of such tools explained and
+    appropriately documented?
+12. [ ] If prior-period documentation exists, are there any missing
+    pieces of evidence that would further enhance the quality of the
+    test?
+13. [ ] Was any information discovered during the walkthrough or inquiry
+    phase that was not incorporated into the test?
+14. [ ] Are there new rules or expectations from your company's internal
+    guidance or your regulatory bodies that would affect the audit
+    approach for this control?
+15. [ ] Was an exception, finding, or deficiency identified as a result
+    of this test?
+    - [ ] Was the control deficient in design, operation, or both?
+    - [ ] What was the root cause of the finding?
+    - [ ] Does the finding indicate other findings or potential fraud?
+    - [ ] What's the severity and scope of the finding?
+    - [ ] Do other controls exist as a form of compensation against the
+      finding's severity, and do they mitigate the risk within the
+      control objective?
+    - [ ] Does the finding exist at the end of the period, or was it
+      resolved within the audit period?