How to Decide Whether to Investigate a Potentially Underlying Problem

Systemic problems are complex. They are solved by an organization, using a particular set of techniques. That institution should apply the same set of rules to software development it employs in other domains. Underlying issues in different areas have the same attributes they have in application construction. Yet, a company may not know whether it should investigate a fundamental issue in its programming operations. That business does not want to engage in a wild goose chase, so it avoids looking into the problem. Still, an organization can identify a systemic difficulty, using a simple approach. That institution does not need to be sure it has an underlying complication. Rather, it only needs to verify it is not wasting its time. A company can be confident it is not spending its hours frivolously hunting a fundamental trouble in its software development activities, by using a collection of simple techniques built on generalized principles.

One concept behind those approaches is speed above absolute confidence. An organization only needs an assurance it is spending its time productively to do a search. It benefits by decreasing the likelihood it will engage in a foolish exploration. However, it does not gain much by guaranteeing an examination will absolutely yield a result. Every hunt involves risk. Moreover, an institution can always stop a search, if the probability of any windfall seems low. Therefore, a company should not invest more than necessary, in its decision to do an exploration. That judgement should be made, using techniques that allow a business to reach a verdict quickly.

That conclusion decides whether an organization is likely to waste its time. Yet, it might not know how to identify misuse. An instance of misapplication can be spotted through a broadly relevant principle. That concept might work counterintuitively to an institution. It might believe it must find a systemic problem, for it to deem its search unwasteful. Yet, an investigation can be fruitful, if it yields useful knowledge. An audit might not find an underlying issue, but it could discover helpful insights. If an examination yields an identifiable benefit, a company did not engage in misuse of its resources. Misapplication can be spotted by an organization, when a search does not yield a detectable gain. An institution can recognize waste, if it applies that general principle.

Broadly applicable concepts such as speed and identifiable benefits do not tell an institution what data to use or how to analyze it. A company needs a different guideline to answer those questions. The answers lie in adhering to a general rule: use a simple dataset scrutinized with basic techniques. Complex information can be time-consuming to interrogate effectively. Those collections are not always necessary. A barebones structure can be analyzed by a business more quickly. An organization can reach the same conclusion it would have attained with more sophisticated data and techniques. The pattern it seeks is so strong that the trend will emerge from almost any structure. A simple set of numbers can be analyzed faster, saving an institution time. That company can make a decision faster. Complex statistics and approaches only slow a business down. They are only helpful, when the pattern is subtle or absolute assurance is necessary. An organization does not require strict confidence to move forward, and it is not looking for deeply hidden trends. Therefore, an institution should use simple datasets probed with basic methods, to make its decision.

Figure 1: Example of a simple data set

Yet, those approaches do not tell a company which elements of that structure to examine first. A business can study a rudimentary collection from several angles. Still, the aspect it should analyze first follows the principle of simplicity. For example, an organization can calculate its frequency of failed projects. That formula only requires long division. It only demands two pieces of data: how many endeavors fall short and how many ventures exist in the set. Those numbers can be inspected, using a straightforward approach. If the rate of sputtering undertakings is high (e.g., 3 to 1, 4 to 1, or 5 to 1 failures to successes), then an organization is unlikely to waste its time investigating a systemic problem. It can make a decision, without further effort. That judgment is made soundly, and it begins by focusing on a simple aspect to study.
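That long-division check can be sketched in a few lines of Python. The project names and outcomes below are hypothetical, chosen only to illustrate the calculation, and the 3-to-1 threshold is the example figure from the text.

```python
# Hypothetical project log: True marks a failed project.
# Names and outcomes are illustrative, not real data.
projects = [
    ("billing-rewrite", True),
    ("mobile-app", True),
    ("api-gateway", False),
    ("data-migration", True),
    ("intranet-portal", True),
]

failed = sum(1 for _, did_fail in projects if did_fail)
total = len(projects)

# The "long division" step: failures over total projects.
failure_rate = failed / total
print(f"{failed} of {total} projects failed ({failure_rate:.0%})")

# A ratio of roughly 3 to 1 or worse suggests an investigation
# is unlikely to waste time.
ratio = failed / (total - failed)
print(f"failure-to-success ratio: {ratio:.1f} to 1")
```

With this invented log, four failures against one success yield a 4-to-1 ratio, which clears the example threshold and supports a decision to investigate.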

However, rudimentary angles are not always effective. A more complex approach might be required by an institution. When a company faces a subtler issue, it should use an incrementally more sophisticated mode of analysis. For example, that business still examines the total number of projects that failed, but it replaces a single category with multiple ones (e.g., schedule, budget, and objective). Those classifications have calculated rates, just as they did before. If an organization finds a group has a high rate (e.g., 3 to 1, 4 to 1, or 5 to 1), then it is unlikely to spend its time on an audit frivolously. An examination should be conducted. That institution made its decision, by using an incrementally more complicated approach.
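The category-by-category version of that check might look like the following sketch. The failure tags and counts are invented for illustration; the categories are the schedule, budget, and objective examples from the text.

```python
from collections import Counter

# Hypothetical failure log: each failed project is tagged with the
# category it missed (schedule, budget, or objective). Illustrative only.
failures = ["schedule", "schedule", "budget", "schedule",
            "objective", "schedule", "budget", "schedule"]
successes = 3  # projects that failed in no category

total = len(failures) + successes
by_category = Counter(failures)

for category, count in by_category.most_common():
    # The same long division as before, applied per category.
    print(f"{category}: {count}/{total} = {count / total:.0%}")
```

A single dominant category (here, the invented "schedule" tag) can stand out even when the overall failure rate looks unremarkable, which is the subtler signal this step is meant to surface.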

Nonetheless, that technique might not yield a result. An approach employing sophisticated statistical tools might then be required to make a judgment. To reach a conclusion, an organization needs a method guided by complexity. That mechanism utilizes whatever an institution has in its quantitative toolkit. That utility belt is wielded to detect clusters or randomness in a company's software development activities. A business has a problem worthy of investigation, if the roots of its issues exhibit consistency. Yet, it also has a difficulty needing examination, if the causes of its challenges demonstrate fickleness. Those characteristics require more sophisticated techniques than a person can execute using basic math and rudimentary Excel functions. A complex toolkit is what an organization requires here. That institution is shepherded, in this situation, by intricacy, not simplicity.
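As a minimal stand-in for that heavier statistical toolkit, a chi-squared goodness-of-fit test against a uniform baseline can flag clustering among failure causes. The cause labels below are hypothetical, and the critical value is the standard 5% figure for two degrees of freedom; a real analysis would likely reach for a statistics library instead.

```python
from collections import Counter

# Hypothetical causes attached to failed projects; illustrative only.
causes = ["staffing", "staffing", "staffing", "staffing",
          "requirements", "staffing", "tooling", "staffing"]

counts = Counter(causes)
n = len(causes)
k = len(counts)
expected = n / k  # count per cause under a "no pattern" (uniform) baseline

# Chi-squared goodness-of-fit statistic against that uniform baseline.
chi2 = sum((observed - expected) ** 2 / expected
           for observed in counts.values())

# Rough critical value for k - 1 = 2 degrees of freedom at p = 0.05.
CRITICAL = 5.99
if chi2 > CRITICAL:
    print(f"chi2 = {chi2:.2f}: causes cluster; an audit looks worthwhile")
else:
    print(f"chi2 = {chi2:.2f}: no strong cluster detected")
```

A large statistic indicates the consistency the text describes; a statistic near zero indicates counts spread almost evenly across causes, which a fuller toolkit would probe further for the "complete lack of predictability" signal.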

Figure 2: Example of random data

Yet, sophistication may not lead to a decision. The approach to finding one demands a company reevaluate its data. Those numbers are heavily dependent upon what projects a business classifies as failures and the exact causes it gives for those missteps. Those defeats and their origins might not be straightforward determinations. They could easily be categorized differently, without an organization being dishonest. If it distributed them in a different manner, then it might make another judgment. Therefore, an institution should reexamine its designations, and it should execute the process again, if it cannot make a ruling. That company could reach a decision, if it adjusted its data and executed another cycle.

Several iterations could be required to draw a conclusion. A few laps should be sufficient for a business to render a judgment. If a handful of trips are insufficient, then that organization should decide not to pursue an investigation of its systemic issues. That search is likely to waste time. The auditing institution could find no evidence that it is likely to learn anything useful. It could not find a high probability of failures, consistency in cause, or a complete lack of predictability in origin. Those elements are sound indicators that a company will benefit from an examination. If a business conducts a search in the absence of those signals, then it risks spending its resources frivolously. Therefore, an organization should decide to forgo an investigation, if it has not rendered a judgment after a few cycles.
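The iterate-then-stop loop can be sketched as follows. The overrun figures, the reclassification thresholds, and the 3-to-1 trigger are all assumptions made for illustration, not values from the article; each threshold stands in for one alternative way of designating projects as failures.

```python
# Sketch of the iterate-then-stop loop: rerun the analysis under a few
# alternative classification schemes and give up after MAX_CYCLES.
MAX_CYCLES = 3
HIGH_RATIO = 3.0  # e.g., 3-to-1 failures to successes

# Hypothetical actual/planned effort ratios for six projects.
projects = [0.9, 1.4, 2.1, 1.1, 3.0, 1.6]
# Each cycle reclassifies "failure" with a stricter overrun threshold.
schemes = [2.0, 1.5, 1.25]

decision = None
for cycle, threshold in enumerate(schemes[:MAX_CYCLES], start=1):
    failed = sum(1 for overrun in projects if overrun > threshold)
    succeeded = len(projects) - failed
    ratio = failed / succeeded if succeeded else float("inf")
    if ratio >= HIGH_RATIO:
        decision = f"investigate (cycle {cycle}, ratio {ratio:.1f} to 1)"
        break

if decision is None:
    # No signal emerged after a few cycles: forgo the search.
    decision = "forgo the investigation"
print(decision)
```

With these invented numbers, no classification scheme produces a 3-to-1 ratio, so the loop exhausts its cycles and the sketch lands on the stopping rule the text recommends.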

Each of those laps begins with simplicity and ends with complexity. The steps in each of those turns move through a process guided by broadly applicable principles. The concepts relevant at certain stages shift. The later phases focus more on handling intricacy and less on dealing with straightforwardness. This approach's goal is speed rather than absolute confidence. It only employs sophisticated techniques, when they are necessary to make a decision. It endeavors only to conclude that a search is likely to yield an organization a beneficial result, not to be certain. That institution is not wasting its time, if an investigation yields useful insights. It can feel confident that an audit is economical, if it follows a process guided by the principles described here. How to conduct that examination and what to do with its results are described in future articles.

Follow ExperTech Insights on Twitter and subscribe to its Substack newsletter.