This summary covers the required literature for the course Data Driven Control (DDC). It contains chapters 1, 2, 4, 5, 6, 7, and 18 of the book Accounting Information Systems (Romney & Steinbart), along with summaries of the articles (Vasarhelyi, Guido L. Geerts, Dai ...
Written for: Rijksuniversiteit Groningen (RuG), Master Accountancy & Controlling, Data Driven Control (EBM154B05)
Data Driven Control Summary
Table of Contents
Week 1 – Introduction
  Romney Chapter 5 – Introduction to Data Analytics in Accounting
  Paper 1 – Week 1 – Vasarhelyi – Big data in accounting
  Paper 2 – Week 1 – Guido – A design science research
Week 2 – Vision of the future
  Paper 3 – Week 2 – Dai – Toward blockchain-based accounting
  Romney Chapter 6 – Transforming data
  Romney Chapter 7 – Data analysis and presentation
Week 3 – Data as an instrument of assurance
  Paper 4 – Week 3 – Kuhn – Continuous auditing in ERP system
  Paper 5 – Week 3 – Singh – Continuous auditing and continuous monitoring
Week 4 – Data as an object of assurance
  Paper 6 – Week 4 – Appelbaum – Impact of business analytics
  Romney Chapter 1 – Accounting information systems: An overview
  Romney Chapter 2 – Overview of transaction processing and ERP systems
Week 5 – Technical aspects of data
  Romney Chapter 4 – Relational databases
  Paper 7 – Week 5 – Gray – A taxonomy to guide research
  Paper 8 – Week 5 – Baader – Reducing false positives in fraud detection
Week 6 – Standardization of data
  Paper 9 – Week 6 – Vasarhelyi – Consequences of XBRL standardization
  Romney Chapter 18 – General ledger and Reporting System
Week 7 – Standardization of data processing
  Paper 10 – Week 7 – Aalst – RPA
  Paper 11 – Week 7 – Huang – Applying RPA in auditing
Week 1 – Introduction
Literature:
- Paper 1 – Vasarhelyi – Big data in accounting
- Paper 2 – Guido – A design science research
- Romney – Chapter 5 – Introduction to Data Analytics in Accounting
PowerPoint:
An accountant needs the following skills: an analytical mindset, an innovative mindset, a
willingness to embrace challenges, curiosity, and technology knowledge.
Moore’s law: States that we can expect the speed and capability of our computers to
increase roughly every two years, while we pay less for them.
Digital disruption: The primary catalyst of change. Digital disruption is increasing, no
industry is immune to it, and it creates many technology-related opportunities. Businesses
therefore use digital disruption to move and evolve at the pace of consumers and markets.
Data + context + understanding = value
Value of data is dependent on several drivers: Availability, quality, relevancy, timeliness,
reliability.
Data mining: Discovering patterns and relationships within data.
Romney Chapter 5 – Introduction to Data Analytics in Accounting
This chapter explores data analytics and the accompanying toolsets needed to turn the
mountain of data organizations collect into useful information.
Big data: Data sets characterized by huge amounts (volume) of frequently updated data
(velocity) in various formats (variety), for which the quality may be suspect (veracity).
- Data volume: The amount of data created and stored by an organization;
- Data velocity: The pace at which data is created and stored;
- Data variety: The different forms data can take;
- Data veracity: the quality and trustworthiness of data.
An analytical mindset is the ability to:
- Ask the right questions;
- Extract, transform and load relevant data;
- Apply appropriate data analytic techniques;
- Interpret and share the results with stakeholders.
Ask the right questions:
To define right or good questions in the context of data analytics, start by establishing
objectives that are SMART: Specific, Measurable, Achievable, Relevant and Timely.
- Specific: Needs to be direct and focused to produce a meaningful answer;
- Measurable: Must be amenable to data analysis and thus the inputs to answering the
question must be measurable with data;
- Achievable: Should be able to be answered and the answer should cause a decision
maker to take an action;
- Relevant: Should relate to the objectives of the organization or the situation under
consideration;
- Timely: Must have a defined time horizon for answering.
Extract, transform and load relevant data:
The process of extracting, transforming, and loading data is often called the ETL process.
This is usually the most time-consuming part of data analysis, because the process differs
each time, for each program, database, or system that stores or uses data.
The AICPA developed a set of Audit Data Standards for guidance in this process.
Extracting data is the first step in the ETL process, which is done in three steps:
1. Understand data needs and the data availability:
- Define: First, properly define the data needs by asking the right questions. Clearly
defining the data makes it easier to determine what data is needed to address the
question; skipping this step leads to extracting the wrong or incomplete data.
- Understanding & organizing: Start by understanding things like the location, accessibility,
and structure of the data. Companies often organize their data with data warehouses,
data marts, and/or data lakes.
Data warehouses: Collect massive data from multiple sources across the organization.
Data marts: Smaller data repositories holding structured data. Given the immense size of
data warehouses, it is often more efficient to process data in data marts (e.g., one data
mart for Europe and one for the USA).
Data lakes: A collection of structured, semi-structured, and unstructured data stored in a
single location. Companies create data lakes to hold all of the organization's data, as well
as relevant data from outside the organization.
Dark data: Information the organization has collected and stored that could be useful, but is
not analysed and is thus ignored.
Data swamps: Data repositories that are not accurately documented, so that the stored data
cannot be properly identified and analysed.
Metadata: Data that describes other data.
See the examples on page 167 of the textbook for the different structures of data warehouses, marts, and lakes.
2. Perform the data extraction;
With a firm understanding of data needs and the location and properties of the data, you are
prepared to extract the needed data. Organizations often have internal controls that restrict
access to different types of data. Data extraction may require receiving permission from the
data owner. With permission of the data owner, the data will then need to be extracted into
separate files or into a flat file. A flat file is a text file that contains data from multiple tables
or sources, merged so that each record occupies a single row. (See page 168 of the textbook
for delimiter and text qualifier examples.)
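As an illustration of the flat-file format described above, here is a minimal sketch (not from the textbook) that parses a hypothetical comma-delimited extract in Python, where the comma is the delimiter and double quotes serve as the text qualifier:

```python
import csv
import io

# A hypothetical flat-file extract: comma as delimiter, double quotes as
# text qualifier so that commas inside field values are preserved.
flat_file = io.StringIO(
    '"InvoiceID","Customer","Amount"\n'
    '"1001","Acme, Inc.","250.00"\n'
    '"1002","Globex","99.95"\n'
)

# csv.reader handles the delimiter and text qualifier (quotechar) for us.
reader = csv.reader(flat_file, delimiter=",", quotechar='"')
header = next(reader)
rows = [dict(zip(header, row)) for row in reader]

print(rows[0]["Customer"])  # prints "Acme, Inc." — the qualifier keeps the comma
```

Without the text qualifier, the comma inside "Acme, Inc." would be read as a field separator and split the value across two columns.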
3. Verify the data extraction quality and document what you have done.
Finally, the quality of the extracted data needs to be verified and the work documented. Batch
processing controls (chapter 13) can be useful to verify the quality of data. An additional
verification step is to reperform the data extraction for a sample of records and compare the
smaller data extract with the full data extract. The final data extraction best practice is
creating a new data dictionary containing all of the information about the fields in the data
extraction.
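The verification best practices above can be sketched as follows. All records, field names, and the control-total check are hypothetical illustrations, not the textbook's own example:

```python
# Hypothetical full extract and a re-extracted sample of records.
full_extract = [
    {"id": 1, "amount": 250.00},
    {"id": 2, "amount": 99.95},
    {"id": 3, "amount": 410.10},
]
sample_reextract = [{"id": 2, "amount": 99.95}]

# Verification: compare each re-extracted sample record against the full
# extract by key, and compute simple batch-style control figures.
full_by_id = {r["id"]: r for r in full_extract}
assert all(full_by_id[r["id"]] == r for r in sample_reextract)

record_count = len(full_extract)
control_total = round(sum(r["amount"] for r in full_extract), 2)
print(record_count, control_total)  # prints: 3 760.05

# A minimal data dictionary documenting the fields in the extraction.
data_dictionary = {
    "id": {"type": "int", "description": "Transaction identifier"},
    "amount": {"type": "float", "description": "Transaction amount"},
}
```

Record counts and control totals are recomputed on the extract and compared with the source system; a mismatch signals an incomplete or corrupted extraction.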
Transforming data
Standardizing, structuring and cleaning the data so that it is in the format needed for data
analysis is called the data transformation process. Chapter 6 discusses this in detail. The
transformation process consists of four steps:
1. Understand the data and the desired outcome;
2. Standardize, structure and clean the data;
3. Validate data quality and verify data meets data requirements;
4. Document the transformation process.
Loading data: Once the data has been structured and cleaned, it can be imported into
whatever tool is used for analysis.
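The four transformation steps can be sketched in a few lines. The raw records, date formats, and data requirements below are illustrative assumptions, not examples from the textbook:

```python
from datetime import datetime

# Step 1 (understand the data): a hypothetical raw extract with
# inconsistent date formats, casing, and a missing value.
raw = [
    {"date": "2023-01-05", "dept": "sales ", "amount": "1,200.50"},
    {"date": "05/01/2023", "dept": "SALES", "amount": "300"},
    {"date": "2023-02-10", "dept": "hr", "amount": None},
]

def parse_date(value):
    """Standardize the two date formats seen in the raw data."""
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
        try:
            return datetime.strptime(value, fmt).date()
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date: {value}")

# Step 2: standardize, structure, and clean the data.
cleaned = [
    {
        "date": parse_date(r["date"]),
        "dept": r["dept"].strip().lower(),
        "amount": float(r["amount"].replace(",", "")),
    }
    for r in raw
    if r["amount"] is not None  # drop records that fail the data requirement
]

# Step 3: validate data quality against the (assumed) requirements.
assert all(row["amount"] > 0 for row in cleaned)

# Step 4: document the transformation, e.g. how many records were dropped.
dropped = len(raw) - len(cleaned)
print(f"Kept {len(cleaned)} records, dropped {dropped}")  # Kept 2 records, dropped 1
```

Once cleaned, a structure like `cleaned` can be loaded directly into the analysis tool of choice.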
Apply appropriate data analytic techniques
Data analytics fall into four categories:
1. Descriptive analytics: Information that results from the examination of data to
understand the past, answers the question: what happened?
Example: accounting ratios.
2. Diagnostic analytics: Information that attempts to determine causal relationships,
answers the question: Why did this happen?
Example: drill-down analysis to determine why sales declined.
3. Predictive analytics: Information that results from analyses that focus on predicting
the future, answers the question: What might happen in the future?
Example: forecasting future events like stock prices.
4. Prescriptive analytics: Information that results from analyses to provide a
recommendations of what should happen, answers the question: What should be done?
Example: algorithms that recommend, or automatically take, a course of action.
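To make the descriptive and predictive categories concrete, here is a small sketch with hypothetical figures: a descriptive accounting ratio, and a deliberately naive trend extrapolation (real predictive analytics would use proper statistical or machine-learning models):

```python
# Descriptive analytics (what happened?): an accounting ratio computed
# from hypothetical balance-sheet figures.
current_assets = 150_000.0
current_liabilities = 100_000.0
current_ratio = current_assets / current_liabilities
print(f"Current ratio: {current_ratio:.2f}")  # Current ratio: 1.50

# Predictive analytics (what might happen?): extrapolate next period's
# sales from the average growth over hypothetical historical figures.
sales = [100.0, 110.0, 120.0, 130.0]
avg_growth = sum(b - a for a, b in zip(sales, sales[1:])) / (len(sales) - 1)
forecast = sales[-1] + avg_growth
print(f"Naive sales forecast: {forecast}")  # Naive sales forecast: 140.0
```

The ratio only describes the past; the forecast makes a claim about the future, which is what separates the two categories.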
Interpret and share the results with stakeholders
Interpreting results: Interpretation requires human judgement. A first common
misinterpretation: correlation is not causation. Correlation indicates that two things tend to
occur together; causation indicates that the occurrence of one thing causes the occurrence of
a second thing. A second common misinterpretation of results stems from systematic biases
in the way people interpret results.