Evaluation – some general pointers
Generating information from data that can serve as a basis for decision-making seems quite simple at first glance.
However, on closer examination this impression changes very quickly. No matter how much data you have – numbers, facts, figures – it is only a fraction of the theoretical total of ALL existing facts. Furthermore, information is always collected with the help of particular technologies and procedures, or drawn from already existing data stocks. The methodology of the collection and the perspective of the observer, but also personal or social conditions and intentions, play a decisive role, both in the collection and in the evaluation. The saying attributed to Dibelius, "I only believe a statistic that I falsified myself!", contains far more truth than one is generally willing to accept.
Without assuming a motive or intention to falsify, it is important to take a close look at the survey procedures in order to exclude or compensate for possible errors, misclassifications, omissions and so on.
Only when the greatest possible freedom from errors has been established (and even this statement is a subjective one!) can an evaluation of the available data/information yield an approximately objective picture.
However, another point also plays a role in this consideration – that of economy.
Increasing accuracy, greater freedom from errors, and a more complex (larger) amount of information usually require a steadily growing volume of resources and thus, in part, exponentially increasing costs.
All of these issues must be taken into consideration when evaluating data. In short – everyone should be aware that the results do not represent an 'absolute' truth but are always only a thorough assessment of the current reality.
The field of statistics, as a branch of mathematics, provides us with many excellent tools for interpreting data correctly – for extracting the core of the puzzle. However, the larger the data sets become, the more diverse the facets that need to be considered, and above all, the more the individual partial data sets differ in weighting (both in relevance and in truthfulness), the more difficult the evaluation becomes. In the end, the interpretation of data and information is the core of any analytical research.
What hidden information can be discovered in large data sets? Do repeating patterns exist, what do they look like, and what are their implications? Which correlations exist between clusters of data? Which predictions can be derived?
The answers to those questions are discovered with the help of special tools, which nowadays can also be found outside of statistics. Combining modern analytical approaches and the increasing use of bots and AI with stored past experiences and solutions – so that upcoming tasks can be solved faster – is exactly a step in that direction. This is the path ARROWS Consulting is taking with its highly efficient analysis tools to deliver those answers.